Bayes’s Theorem and Weighing Evidence by Juries
Abstract
At first sight, there may appear to be little connection between Statistics and Law. On closer inspection it can be seen that the problems they tackle are in many ways identical — although they go about them in very different ways. In a broad sense, each subject can be regarded as concerned with the Interpretation of Evidence. I owe my own introduction to the common ground between the two activities to my colleague William Twining, Professor of Jurisprudence at University College London, who has long been interested in probability in the law. In our discussions we quickly came to realise that, for both of us, the principal objective in teaching our students was the same: to train them to be able to interpret a mixed mass of evidence. That contact led to my contributing some lectures on uses and abuses of probability and statistics in the law to the University of London Intercollegiate LLM course on Evidence and Proof (and an Appendix on “Probability and Proof” to Anderson and Twining (1991)), as well as drawing me into related research (Dawid 1987; Dawid 1994; Dawid and Mortera 1996; Dawid and Mortera 1998).

To my initial surprise, I found here a rich and stimulating source of problems, simultaneously practical and philosophical, to challenge my logical and analytical problem-solving skills. For general background on some of the issues involved, see Eggleston (1983); Robertson and Vignaux (1995); Aitken (1995); Evett and Weir (1998); Gastwirth (2000).

The current state of legal analysis of evidence seems to me similar to that of science before Galileo, in thrall to the authority of Aristotle and loth to concede the need to break away from old habits of thought. Galileo had the revolutionary idea that scientists should actually look at how the world behaves. It may be equally revolutionary to suggest that lawyers might look at how others have approached the problem of interpretation of evidence, and that they might even have something to learn from them.
It is my strong belief (though I do not ...
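The “weighing of evidence” in the title is commonly formalised through Bayes’s theorem in its odds form, in which each item of evidence multiplies the prior odds by a likelihood ratio. A minimal sketch of that update, with entirely hypothetical numbers not taken from the paper:

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Posterior probability via Bayes's theorem in odds form:
    posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Entirely hypothetical numbers: a prior probability of 1 in 1000, and
# evidence 10,000 times more probable under one hypothesis than the other.
p = posterior_probability(0.001, 10_000.0)
print(round(p, 3))  # 0.909
```

The odds form makes the contribution of each piece of evidence explicit: independent items of evidence each contribute one multiplicative likelihood ratio.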
Similar Articles
On Bayes’ Theorem for Improper Mixtures by Peter McCullagh
Although Bayes’s theorem demands a prior that is a probability distribution on the parameter space, the calculus associated with Bayes’s theorem sometimes generates sensible procedures from improper priors, Pitman’s estimator being a good example. However, improper priors may also lead to Bayes procedures that are paradoxical or otherwise unsatisfactory, prompting some authors to insist that al...
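A textbook instance of this phenomenon (not taken from the abstract above): for a single observation \(x \sim N(\theta, \sigma^2)\) with \(\sigma^2\) known and the improper flat prior \(\pi(\theta) \propto 1\), the formal posterior is nevertheless proper,

```latex
\pi(\theta \mid x) \;\propto\; 1 \cdot \exp\!\left\{-\frac{(x-\theta)^2}{2\sigma^2}\right\}
\quad\Longrightarrow\quad
\theta \mid x \;\sim\; N(x, \sigma^2),
```

and its mean, the generalized Bayes estimate \(\hat\theta = x\), is the Pitman (best equivariant) estimator of location mentioned in the abstract.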
Generalizing the Standard Product Rule of Probability Theory and Bayes’s Theorem by Arnold
In this paper the usual product rule of probability theory is generalized by relaxing the assumption that elements of sets are equally likely to be drawn. The need for such a generalization has been noted by Jeffreys (1998, pp. 24-25), among others, in his work on an axiom system for scientific learning from data utilizing Bayes’s Theorem. It is shown that by allowing probabilities of elements ...
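The standard product rule referred to above is \(P(A \cap B) = P(A)\,P(B \mid A)\). A minimal sketch under the usual equally-likely assumption (the assumption the paper relaxes), using a hypothetical urn of 3 red and 2 blue balls drawn without replacement:

```python
from fractions import Fraction

# P(both draws red) = P(first red) * P(second red | first red),
# with every ball equally likely to be drawn.
p_first_red = Fraction(3, 5)               # 3 red balls out of 5
p_second_red_given_first = Fraction(2, 4)  # 2 red left out of 4
p_both_red = p_first_red * p_second_red_given_first
print(p_both_red)  # 3/10
```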
A note on Bahadur’s expansion in Bayesian diagnostic algorithms
Scheinok’s (1972) empirical results, obtained from using Bahadur’s expansion in Bayes’s theorem, are explained by noting that the expansion is an exact representation of observed probabilities and thus no information was gained by its use. The calculated and observed joint probability distributions will always be equal. It is also demonstrated that posterior probabilities equal to the ratio of ...
Symbolic Bayesian Inference
Bayesian inference, of posterior knowledge based on prior knowledge and observed evidence, is typically implemented by applying Bayes’s theorem, solving an equation in which the posterior multiplied by the probability of an observation equals a joint probability. But when we observe a value of a continuous variable, the observation usually has probability zero, and Bayes’s theorem says only tha...
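For a continuous observation, the zero-probability difficulty described above is usually sidestepped by working with densities: the posterior density is proportional to the prior density times the likelihood density. A minimal sketch with a conjugate normal–normal model (all numbers hypothetical):

```python
def normal_posterior(mu0, tau2, sigma2, x):
    """Posterior mean and variance for a N(mu0, tau2) prior on the mean
    of a N(mean, sigma2) likelihood, after observing the single value x.
    Obtained by multiplying the prior density by the likelihood density."""
    var = 1.0 / (1.0 / tau2 + 1.0 / sigma2)  # precisions add
    mean = var * (mu0 / tau2 + x / sigma2)   # precision-weighted average
    return mean, var

mean, var = normal_posterior(0.0, 1.0, 1.0, 2.0)
print(mean, var)  # 1.0 0.5
```

Here the observed value x enters only through the likelihood density, even though the event of observing exactly x has probability zero.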